Towards Verifiably Ethical Robot Behaviour
Ensuring that autonomous systems behave ethically is both complex and difficult. However, the idea of having an additional `governor' that assesses the options available to the system, and prunes them to select the most ethical choices, is well understood. Recent work has produced such a governor consisting of a `consequence engine' that assesses the likely future outcomes of actions and then applies a Safety/Ethical logic to select actions. Although this is appealing, it is impossible to be certain that the most ethical options are actually taken. In this paper we extend and apply a well-known agent verification approach to our consequence engine, allowing us to verify the correctness of its ethical decision-making.

Comment: Presented at the 1st International Workshop on AI and Ethics, Sunday 25th January 2015, Hill Country A, Hyatt Regency Austin. Will appear in the workshop proceedings published by AAAI.
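The governor-plus-consequence-engine arrangement the abstract describes can be sketched in a few lines. This is a minimal illustration under assumed names: `consequence_engine`, `ethical_governor`, the `predict` and `harm` functions, and the least-harm rule are all hypothetical stand-ins, not the paper's actual Safety/Ethical logic.

```python
# Hedged sketch of an ethical governor: it does not choose actions itself,
# it prunes the candidate set to those whose predicted outcomes are least
# harmful, leaving the robot's own controller to pick among the survivors.
def consequence_engine(action, world, predict):
    """Predict the likely future outcome of taking `action` in `world`."""
    return predict(world, action)

def ethical_governor(actions, world, predict, harm):
    """Prune candidate actions to those with the least predicted harm."""
    scored = [(harm(consequence_engine(a, world, predict)), a) for a in actions]
    least = min(h for h, _ in scored)
    return [a for h, a in scored if h == least]

# Toy usage: a robot at position 2 on a line, a human at position 3,
# harm = 1 if the robot's predicted position collides with the human.
predict = lambda world, action: world + action
harm = lambda pos: 1.0 if pos == 3 else 0.0
safe_actions = ethical_governor([-1, 0, 1], 2, predict, harm)  # [-1, 0]
```

The point of the design is separation of concerns: verification effort can then focus on the pruning logic rather than on the whole control stack.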
Soft-decision minimum-distance sequential decoding algorithm for convolutional codes
The maximum-likelihood decoding of convolutional codes has generally been considered impractical for other than relatively short constraint length codes, because of the exponential growth in complexity with increasing constraint length. The soft-decision minimum-distance decoding algorithm proposed in the paper approaches the performance of a maximum-likelihood decoder, and uses a sequential decoding approach to avoid an exponential growth in complexity. The algorithm also utilises the distance and structural properties of convolutional codes to considerably reduce the amount of searching needed to find the minimum soft-decision distance paths when a back-up search is required. This is done in two main ways. First, a small set of paths called permissible paths are utilised to search the whole of the subtree for the better path, instead of using all the paths within a given subtree. Secondly, the decoder identifies which subset of permissible paths should be utilised in a given search and which may be ignored. In this way many unnecessary path searches are completely eliminated. Because the decoding effort required by the algorithm is low, and the decoding processes are simple, the algorithm opens the possibility of building high-speed long constraint length convolutional decoders whose performance approaches that of the optimum maximum-likelihood decoder. The paper describes the algorithm and its theoretical basis, and gives examples of its operation. Also, results obtained from practical implementations of the algorithm using a high-speed microcomputer are presented.
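The general idea of sequential decoding with a soft-decision distance metric can be sketched with a stack decoder. This is an illustrative sketch only: it uses a tiny constraint-length-3 rate-1/2 code (the paper targets long constraint lengths), a simple Fano-style per-symbol bias of 2.0 chosen for the demo, and none of the paper's permissible-path machinery.

```python
import heapq

K = 3                     # constraint length (deliberately small for the demo)
G = (0b111, 0b101)        # rate-1/2 generator polynomials (7, 5) in octal

def step(state, bit):
    """One encoder transition: returns (new state, two output symbols in +/-1)."""
    reg = ((state << 1) | bit) & ((1 << K) - 1)
    out = [1.0 if bin(reg & g).count("1") % 2 else -1.0 for g in G]
    return reg & ((1 << (K - 1)) - 1), out

def encode(bits):
    state, sym = 0, []
    for b in bits + [0] * (K - 1):            # tail bits flush the encoder
        state, out = step(state, b)
        sym += out
    return sym

def stack_decode(rx, n_bits, bias=2.0):
    """Stack sequential decoder minimising soft-decision squared distance.
    Subtracting a per-symbol bias keeps partial paths of different lengths
    comparable, so the best path is extended first and most of the tree is
    never searched."""
    total = n_bits + K - 1
    heap = [(0.0, 0, 0, ())]                  # (metric, depth, state, bits)
    while heap:
        m, depth, state, bits = heapq.heappop(heap)
        if depth == total:
            return list(bits[:n_bits])        # drop the tail bits
        for b in ([0] if depth >= n_bits else [0, 1]):
            ns, out = step(state, b)
            r = rx[2 * depth:2 * depth + 2]
            d = sum((ri - si) ** 2 - bias for ri, si in zip(r, out))
            heapq.heappush(heap, (m + d, depth + 1, ns, bits + (b,)))
```

With clean or mildly noisy soft inputs the decoder recovers the message while extending only paths near the received sequence, which is the complexity saving the abstract relies on.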
Modelling a wireless connected swarm of mobile robots
It is a characteristic of swarm robotics that modelling the overall swarm behaviour in terms of the low-level behaviours of individual robots is very difficult. Yet if swarm robotics is to make the transition from the laboratory to real-world engineering realisation, such models would be critical for both overall validation of algorithm correctness and detailed parameter optimisation. We seek models with predictive power: models that allow us to determine the effect of modifying parameters in individual robots on the overall swarm behaviour. This paper presents results from a study to apply the probabilistic modelling approach to a class of wireless connected swarms operating in unbounded environments. The paper proposes a probabilistic finite state machine (PFSM) that describes the network connectivity and overall macroscopic behaviour of the swarm, then develops a novel robot-centric approach to the estimation of the state transition probabilities within the PFSM. Using measured data from simulation, the paper then carefully validates the PFSM model step by step, allowing us to assess the accuracy and hence the utility of the model. © Springer Science + Business Media, LLC 2008
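The macroscopic side of a PFSM model can be illustrated with the simplest possible case: two states per robot, Connected (C) and Lost (L). The state names, the `p_cl`/`p_lc` transition probabilities, and the update rule below are hypothetical stand-ins for the paper's larger machine and its robot-centrically estimated probabilities.

```python
# Two-state macroscopic PFSM sketch: iterate the expected fraction of robots
# in the Connected state, given per-timestep transition probabilities
# p_cl (Connected -> Lost) and p_lc (Lost -> Connected).
def pfsm_fraction_connected(p_cl, p_lc, n_steps, frac_c=1.0):
    for _ in range(n_steps):
        frac_c = frac_c * (1 - p_cl) + (1 - frac_c) * p_lc
    return frac_c

# The fixed point is p_lc / (p_cl + p_lc), independent of the initial
# fraction -- exactly the kind of swarm-level prediction from robot-level
# parameters that the abstract argues such models should deliver.
```

Changing a robot-level parameter (say, radio range, which would alter `p_cl`) immediately changes the predicted swarm-level connectivity, which is the "predictive power" the paper asks of a model.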
Safety in Numbers: Fault Tolerance in Robot Swarms
The swarm intelligence literature frequently asserts that swarms exhibit high levels of robustness. That claim is, however, rather less frequently supported by empirical or theoretical analysis. But what do we mean by a 'robust' swarm? How would we measure the robustness or – to put it another way – fault-tolerance of a robotic swarm? These questions are not just of academic interest. If swarm robotics is to make the transition from the laboratory to real-world engineering implementation, we would need to be able to address these questions in a way that would satisfy the needs of the world of safety certification. This paper explores fault-tolerance in robot swarms through Failure Mode and Effect Analysis (FMEA) and reliability modelling. The work of this paper is illustrated by a case study of a wireless connected robot swarm, employing both simulation and real-robot laboratory experiments.
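One standard way to quantify the "safety in numbers" intuition is a k-out-of-n reliability model: the swarm is considered functional if at least k of its n robots are working. This is a textbook model offered as an illustration of reliability modelling in general, with independent failures assumed; it is not claimed to be the specific model the paper develops.

```python
from math import comb

def swarm_reliability(n, k, p):
    """Probability that at least k of n robots are functioning, assuming
    independent failures and per-robot reliability p (k-out-of-n model)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# e.g. a 10-robot swarm whose task tolerates up to 5 failed robots:
# swarm_reliability(10, 5, 0.9) is far higher than the reliability of
# any single robot, which is the usual basis of the robustness claim.
```

The model also exposes the limits of the claim: if the task needs nearly all robots (k close to n), swarm reliability can be *worse* than a single robot's, which is why the abstract insists the robustness assertion needs analysis rather than assumption.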
On embodied memetic evolution and the emergence of behavioural traditions in Robots
This paper describes ideas and initial experiments in embodied imitation using e-puck robots, developed as part of a project whose aim is to demonstrate the emergence of artificial culture in collective robot systems. Imitated behaviours (memes) will undergo variation because of the noise and heterogeneities of the robots and their sensors. Robots can select which memes to enact, and, because we have a multi-robot collective, memes are able to undergo multiple cycles of imitation, with inherited characteristics. We thus have the three evolutionary operators: variation, selection and inheritance, and, as we describe in this paper, experimental trials show that we are able to demonstrate embodied movement-meme evolution. © 2011 Springer-Verlag
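The three evolutionary operators the abstract names can be made concrete in a few lines. Everything here is a hypothetical sketch, not the project's code: a "meme" is represented as a short sequence of turn angles, variation comes from Gaussian imitation noise, and selection is a simple random choice.

```python
import random

random.seed(1)  # illustrative run; assertions below don't depend on the seed

def imitate(meme, noise=0.1):
    """Variation: imitation corrupted by sensing/actuation noise."""
    return [g + random.gauss(0.0, noise) for g in meme]

def meme_cycle(population, rounds=10):
    """Selection and inheritance: each round one meme is chosen for enactment,
    and an observing robot inherits a noisy copy of it."""
    for _ in range(rounds):
        parent = random.choice(population)   # selection
        population.append(imitate(parent))   # inheritance with variation
    return population
```

Because copies of copies accumulate noise, lineages drift over repeated cycles, which is the embodied movement-meme evolution the trials demonstrate.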
The solvation and dissociation of 4-benzylaniline hydrochloride in chlorobenzene
A reaction scheme is proposed to account for the liberation of 4-benzylaniline from 4-benzylaniline hydrochloride, using chlorobenzene as a solvent at a temperature of 373 K. Two operational regimes are explored: “closed” reaction conditions correspond to the retention of evolved hydrogen chloride gas within the reaction medium, whereas an “open” system permits gaseous hydrogen chloride to be released from the reaction medium. The solution phase chemistry is analyzed by 1H NMR spectroscopy. Complete liberation of solvated 4-benzylaniline from solid 4-benzylaniline hydrochloride is possible under “open” conditions, with the entropically favored conversion of solvated hydrogen chloride to the gaseous phase thought to be the thermodynamic driver that effectively controls a series of interconnecting equilibria. A kinetic model is proposed to account for the observations of the open system
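The open-versus-closed contrast in the abstract can be illustrated with a toy kinetic model: dissociation of the salt is reversible in solution, and in the open regime solvated HCl is additionally drained to the gas phase, pulling the equilibria toward complete liberation. The rate constants and the forward-Euler scheme below are illustrative choices only, not values fitted in the paper.

```python
def dissolution(open_system, k_f=1.0, k_r=5.0, k_esc=2.0, dt=0.01, steps=5000):
    """Toy forward-Euler kinetics for
        salt <-> amine(solv) + HCl(solv),   with
        HCl(solv) -> HCl(g)                 only in the open regime.
    Returns (salt, amine, HCl_solv) after `steps` timesteps."""
    salt, amine, hcl = 1.0, 0.0, 0.0
    for _ in range(steps):
        diss = k_f * salt - k_r * amine * hcl          # net dissociation rate
        esc = k_esc * hcl if open_system else 0.0      # HCl loss to gas phase
        salt += -diss * dt
        amine += diss * dt
        hcl += (diss - esc) * dt
    return salt, amine, hcl
```

In the closed regime the model settles at a partial conversion set by the equilibrium constant, while in the open regime the escape term keeps solvated HCl low and conversion runs essentially to completion, mirroring the entropically driven behaviour the abstract describes.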
Coulomb and nuclear breakup effects in the single neutron removal reaction 197Au(17C,16C gamma)X
We analyze the recently obtained data on the partial cross sections and parallel momentum distributions for transitions to the ground as well as excited states of the 16C core, in the one-neutron removal reaction 197Au(17C,16C gamma)X at a beam energy of 61 MeV/nucleon. The Coulomb and nuclear breakup components of the one-neutron removal cross sections have been calculated within a finite-range distorted-wave Born approximation theory and an eikonal model, respectively. The nuclear contributions dominate the partial cross sections for the core excited states. By adding the nuclear and Coulomb cross sections together, a reasonable agreement is obtained with the data for these states. The shapes of the experimental parallel momentum distributions of the core states are described well by the theory.

Comment: Revtex format, two figures included, to appear in Phys. Rev. C (Rapid Communications).